[HOLD] Adding method to sample "best" frames from a video #37
base: develop
Conversation
Cool. I'll make an effort to look later. Any thoughts on how to substantiate the utility of this angle?
`target_num_frames` isn't quite working right. I tried to run

```py
fob.sample_best_video_frames(
    "~/fiftyone/quickstart-video/data/0587e1cfc2344523922652d8b227fba4-000014-video_052.mp4",
    "test_video_frames/",
    target_num_frames=4,
)
```

It returned 6 frames instead of 4.
Oops, good catch. There was a parentheses bug that had been there since THETA (just fixed it there too) that was always causing the first and last frames to be sampled. Now it works as expected!
Nice! Works great.
To answer your question about how to let a user get lower quality frames, one way to do it may be to let them select the percentage of frames that are selected by `random`, with all others selected by `high quality`.
Hmm, that could be an option, yes. I was also thinking about allowing the user to specify a quality percentile, and the algorithm would sample the frame with quality at the given percentile within each window. The current behavior would correspond to the 100th percentile.
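For illustration, here is a minimal sketch of that percentile idea (a hypothetical helper, not code from this PR), assuming each window exposes per-frame quality scores: pick the frame whose quality is closest to the requested percentile, with `quality_percentile=100` reproducing the current highest-quality behavior.

```py
import numpy as np


def select_frame_in_window(frame_numbers, qualities, quality_percentile=100):
    """Selects the frame whose quality is closest to the requested
    percentile of the quality scores within this window.

    `quality_percentile=100` reproduces the current behavior of always
    taking the highest quality frame in the window.
    """
    qualities = np.asarray(qualities, dtype=float)
    target = np.percentile(qualities, quality_percentile)
    idx = int(np.argmin(np.abs(qualities - target)))
    return frame_numbers[idx]


# Example: 100 returns the best frame; lower values return lower quality frames
frames = [10, 11, 12, 13, 14]
quals = [0.31, 0.87, 0.55, 0.92, 0.64]
print(select_frame_in_window(frames, quals, quality_percentile=100))  # 13
print(select_frame_in_window(frames, quals, quality_percentile=50))   # 14
```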
Ahh, I think your suggestion is better. My suggestion would provide varying quality frames WITHIN a video. But it makes more sense to do what you said and let the user make videos where the quality varies between them but is similar for all frames within the same video.
Ports over work from https://github.com/voxel51/theta on adaptively sampling frames from video.
Specifically, provides a `fiftyone.brain.sample_best_video_frames()` method in the public brain interface. This is a precursor to another method that will sample frames dynamically across a collection of videos.
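Based on the usage exercised in the conversation above, a call looks roughly like the following (a sketch only; the paths are placeholders and the full set of parameters and defaults is not reproduced here):

```py
import fiftyone.brain as fob

# Samples the "best" frames from the video and writes the selected frames to
# the output directory; `target_num_frames` caps how many frames are kept
fob.sample_best_video_frames(
    "/path/to/video.mp4",
    "/path/for/output-frames/",
    target_num_frames=4,
)
```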
Another feature that I want to add is to control how to sample frames within each equal-motion bin. Currently the highest quality frame is always selected, but a representative dataset for training needs to also include lower quality images. Thinking about the best way to expose this. Maybe the user gets to select between `high quality`, `low quality`, and `random`?
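For concreteness, a minimal sketch of what that option could look like for a single equal-motion bin (the parameter name and mode values here are hypothetical, not part of this PR):

```py
import random

import numpy as np


def pick_frame_in_bin(frame_numbers, qualities, mode="high quality", seed=None):
    """Hypothetical sketch of per-bin frame selection.

    `mode` controls which frame is taken from an equal-motion bin: the
    highest quality frame, the lowest quality frame, or a uniformly
    random frame.
    """
    rng = random.Random(seed)
    if mode == "high quality":
        idx = int(np.argmax(qualities))
    elif mode == "low quality":
        idx = int(np.argmin(qualities))
    elif mode == "random":
        idx = rng.randrange(len(frame_numbers))
    else:
        raise ValueError("Unsupported mode '%s'" % mode)

    return frame_numbers[idx]
```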